Neural Reversible Steganography with Long Short-Term Memory

Authors

Abstract

Deep learning has brought about a phenomenal paradigm shift in digital steganography. However, there is as yet no consensus on the use of deep neural networks in reversible steganography, a class of steganographic methods that permits the distortion caused by message embedding to be removed. The underdevelopment of the field of reversible steganography with deep learning can be attributed to the perception that perfect reversal seems scarcely achievable, due to the lack of transparency and interpretability of neural networks. Rather than employing neural networks in the coding module of a reversible steganographic scheme, we instead apply them to an analytics module that exploits data redundancy to maximise steganographic capacity. State-of-the-art reversible steganographic schemes for digital images are based primarily on a histogram-shifting method, in which the analytics module is often modelled as a pixel intensity predictor. In this paper, we propose to refine the prior estimation from a conventional linear predictor through a neural network model. The refinement can to some extent be viewed as a low-level vision task (e.g., noise reduction and super-resolution imaging). In this way, we explore a leading-edge neuroscience-inspired model of long short-term memory with a brief discussion of its biological plausibility. Experimental results demonstrated a significant boost contributed by the neural network model in terms of prediction accuracy and rate-distortion performance.
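To make the histogram-shifting mechanism mentioned above concrete, here is a minimal sketch of prediction-error histogram shifting on a 1-D signal. It is not the paper's method: the predictor is a toy left-neighbour (causal) predictor standing in for the conventional linear predictor, the LSTM refinement is omitted, and pixel saturation is ignored (real schemes handle overflow with a location map). The function names `embed` and `extract` are illustrative, not from the paper.

```python
import numpy as np

def embed(x, bits):
    """Histogram-shifting embedding on prediction errors of a 1-D signal.

    Toy causal predictor: each sample is predicted by its left neighbour.
    Positive errors are shifted up by one to free the bin e == 1, and one
    payload bit is embedded into each sample whose error falls in the
    zero bin. Overflow handling (location map) is omitted for brevity.
    """
    y = x.astype(int).copy()
    it = iter(bits)
    for i in range(1, len(x)):
        e = int(x[i]) - int(x[i - 1])   # prediction error w.r.t. originals
        if e >= 1:
            y[i] += 1                   # shift positive errors out of bin 1
        elif e == 0:
            y[i] += next(it, 0)         # embed one payload bit in the zero bin
    return y

def extract(y):
    """Recover the original signal and the embedded bits in one causal pass.

    The first sample is never modified, so the decoder can rebuild each
    original sample before it is needed as the next prediction context.
    """
    x = y.astype(int).copy()
    bits = []
    for i in range(1, len(y)):
        e = int(y[i]) - int(x[i - 1])   # error against the *recovered* neighbour
        if e >= 2:
            x[i] = y[i] - 1             # undo the shift (original error >= 1)
        elif e == 1:
            bits.append(1)              # embedded bit 1 in the zero bin
            x[i] = y[i] - 1
        elif e == 0:
            bits.append(0)              # embedded bit 0 in the zero bin
        # e < 0: sample was left untouched by the encoder
    return x, bits
```

Capacity equals the number of zero prediction errors, which is why a sharper predictor (e.g., the neural refinement the paper proposes) directly raises the embedding rate: more samples land in the zero bin.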


Similar resources

Stacked Long Short-term Memory Neural Networks

In this paper, we describe a novel approach to generating conference calls-for-papers using Natural Language Processing and a Long Short-Term Memory network. The approach has been successfully evaluated on a publicly available dataset.


The Effects of Keyword and Context Methods on Pronunciation and Receptive/Productive Vocabulary of Low-Intermediate Iranian EFL Learners: Short-Term and Long-Term Memory in Focus

From past to present, a great deal of research has been conducted, all of which attests in one way or another to the fruitfulness of using vocabulary-learning strategies in a foreign language. This study investigates the effect of two different methods of teaching English vocabulary (keyword and context) on the pronunciation and lexical knowledge of low-intermediate Iranian EFL learners and on its retention in memory. To this end, sixty Iranian language learners aged eight to fourteen...


Long Short-term Memory

Model compression is significant for the wide adoption of Recurrent Neural Networks (RNNs) in both user devices possessing limited resources and business clusters requiring quick responses to large-scale service requests. This work aims to learn structurally-sparse Long Short-Term Memory (LSTM) by reducing the sizes of basic structures within LSTM units, including input updates, gates, hidden s...


Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, ...



Journal

Journal title: Security and Communication Networks

Year: 2021

ISSN: 1939-0122, 1939-0114

DOI: https://doi.org/10.1155/2021/5580272